IRWIN AND JOAN JACOBS CENTER FOR COMMUNICATION AND INFORMATION TECHNOLOGIES
Entropy Bounds for Discrete Random Variables via Coupling
Author
Abstract
This paper derives new entropy bounds for discrete random variables via maximal coupling. It provides bounds on the difference between the entropies of two discrete random variables in terms of the local and total variation distances between their probability mass functions. These bounds apply to both finite and countably infinite alphabets, and particular cases of them reproduce some known results. The use of the new entropy bounds is exemplified via bounds on the above distances obtained by Stein's method, and the resulting improvement is illustrated.
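As a point of reference, one classical result of the kind alluded to here is the L1 continuity bound for entropy on a finite alphabet: if theta = ||P - Q||_1 <= 1/2, then |H(P) - H(Q)| <= theta * log(|X| / theta) (Cover and Thomas). A minimal numerical sketch with illustrative pmfs, chosen here rather than taken from the paper:

```python
import math

def entropy(p):
    """Shannon entropy in nats; zero-probability symbols contribute nothing."""
    return -sum(x * math.log(x) for x in p if x > 0)

def l1_distance(p, q):
    """L1 distance between two pmfs on a common finite alphabet."""
    return sum(abs(x - y) for x, y in zip(p, q))

# Illustrative pmfs on an alphabet of size 4 (not taken from the paper).
p = [0.40, 0.30, 0.20, 0.10]
q = [0.35, 0.35, 0.15, 0.15]

theta = l1_distance(p, q)                  # here theta = 0.2 <= 1/2
bound = theta * math.log(len(p) / theta)   # theta * log(|X| / theta)
print(abs(entropy(p) - entropy(q)), "<=", bound)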
Similar resources
Tightened Exponential Bounds for Discrete-Time, Conditionally Symmetric Martingales with Bounded Jumps
This letter derives some new exponential bounds for discrete-time, real-valued, conditionally symmetric martingales with bounded jumps. The new bounds are extended to conditionally symmetric sub-/supermartingales, and are compared to some existing bounds.
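The classical benchmark that such tightened bounds improve upon is the Azuma-Hoeffding inequality: for a martingale with jumps bounded by d, P(|M_n - M_0| >= r) <= 2 exp(-r^2 / (2 n d^2)). A minimal sketch, using a +/-1 random walk (a conditionally symmetric martingale) with illustrative parameters, shows how loose the classical bound can be, which is the slack that tightened bounds aim to close:

```python
import math
import random

def azuma_bound(r, n, d):
    """Two-sided Azuma-Hoeffding bound: P(|M_n - M_0| >= r) <= 2 exp(-r^2 / (2 n d^2))."""
    return 2.0 * math.exp(-r * r / (2.0 * n * d * d))

# A +/-1 random walk is a conditionally symmetric martingale with jumps bounded by d = 1.
n, trials, r = 100, 20000, 25       # illustrative parameters
random.seed(0)
exceed = sum(
    abs(sum(random.choice((-1, 1)) for _ in range(n))) >= r
    for _ in range(trials)
)
print("empirical tail       :", exceed / trials)
print("Azuma-Hoeffding bound:", azuma_bound(r, n, d=1))
```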
Improved Lower Bounds on the Total Variation Distance for the Poisson Approximation
New lower bounds on the total variation distance between the distribution of a sum of independent Bernoulli random variables and the Poisson random variable (with the same mean) are derived via the Chen-Stein method. The new bounds rely on a non-trivial modification of the analysis by Barbour and Hall (1984), which surprisingly gives a significant improvement. A use of the new lower bounds is ad...
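For context, the Barbour-Hall (1984) bounds sandwich this total variation distance: (1/32)(1 ∧ 1/λ) Σ p_i² <= d_TV(S_n, Poi(λ)) <= ((1 - e^{-λ})/λ) Σ p_i², where λ = Σ p_i. A minimal sketch that computes the exact distance by convolution and checks it against the classical constants (the success probabilities are illustrative, and the lower-bound constant 1/32 is the classical one that this line of work improves):

```python
import math

def bernoulli_sum_pmf(ps):
    """Exact pmf of S = sum of independent Bernoulli(p_i), built by convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += (1.0 - p) * mass      # this summand is 0
            new[k + 1] += p * mass          # this summand is 1
        pmf = new
    return pmf

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

ps = [0.1] * 20                     # illustrative success probabilities
lam = sum(ps)                       # matching mean
pmf = bernoulli_sum_pmf(ps)

# d_TV = (1/2) * sum_k |P(S = k) - Poi(lam)(k)|; beyond n the Bernoulli sum has
# no mass, so the remaining Poisson tail contributes in full.
tv = 0.5 * sum(abs(pmf[k] - poisson_pmf(lam, k)) for k in range(len(pmf)))
tv += 0.5 * (1.0 - sum(poisson_pmf(lam, k) for k in range(len(pmf))))

s2 = sum(p * p for p in ps)
upper = (1.0 - math.exp(-lam)) / lam * s2          # Barbour-Hall upper bound
lower = (1.0 / 32.0) * min(1.0, 1.0 / lam) * s2    # classical lower bound
print(lower, "<=", tv, "<=", upper)
```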
On Improved Bounds for Probability Metrics and f-Divergences
Derivation of tight bounds for probability metrics and f-divergences is of interest in information theory and statistics. This paper provides elementary proofs that lead, in some cases, to significant improvements over existing bounds; they also lead to the derivation of some existing bounds in a simplified way. The inequalities derived in this paper relate the Bhattacharyya parameter,...
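One classical inequality in this family is Pinsker's inequality, D(P||Q) >= 2 d_TV(P, Q)^2 in nats, which relates the KL divergence to the total variation distance. A minimal numerical check with illustrative pmfs (not taken from the paper):

```python
import math

def kl_divergence(p, q):
    """KL divergence D(P||Q) in nats; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)

def total_variation(p, q):
    """Total variation distance, i.e. half the L1 distance between the pmfs."""
    return 0.5 * sum(abs(x - y) for x, y in zip(p, q))

# Illustrative pmfs (not taken from the paper).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

tv = total_variation(p, q)
print(kl_divergence(p, q), ">=", 2.0 * tv ** 2)   # Pinsker: D(P||Q) >= 2 * TV^2
```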
Erasure/List Random Coding Error Exponents Are Not Universally Achievable
An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method
The first part of this work considers the entropy of the sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds on the error that follows from an approximation of this entropy by the entropy of a Poisson random variable with the same mean are derived via the Chen-Stein method. The second part of this work derives new lower bounds on the total variat...
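In the i.i.d. special case the Bernoulli sum is binomial, so the entropy approximation described here can be checked directly. A minimal sketch comparing H(Binomial(n, p)) with H(Poisson(np)), using illustrative n and p rather than values from the paper:

```python
import math

def binomial_entropy(n, p):
    """Entropy (nats) of Binomial(n, p), summed over its full support."""
    h = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * p ** k * (1.0 - p) ** (n - k)
        if pk > 0:
            h -= pk * math.log(pk)
    return h

def poisson_entropy(lam, tol=1e-15):
    """Entropy (nats) of Poisson(lam); terms below tol are numerically negligible."""
    h, k, pk = 0.0, 0, math.exp(-lam)
    while pk > tol or k <= lam:
        if pk > 0:
            h -= pk * math.log(pk)
        k += 1
        pk *= lam / k
    return h

n, p = 20, 0.1      # illustrative i.i.d. case (not values from the paper)
print("H(sum of Bernoullis)  =", binomial_entropy(n, p))
print("H(Poisson, same mean) =", poisson_entropy(n * p))
```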
Journal:
Volume, Issue:
Pages: -
Publication date: 2012